
Fix for Issue #205 (#207)

Open
Raiden129 wants to merge 2 commits into nikopueringer:main from ExperimentationT:main

Conversation

@Raiden129

Prevent `_prompt_inference_settings()` from raising `UnboundLocalError` when the resolved backend is not `torch` (for example, MLX on Apple Silicon).

The function always returned `InferenceSettings`, but `generate_comp` and `gpu_post_processing` were only assigned inside the torch-specific branch. On MLX this left both locals undefined and caused inference to crash before startup.

The fix initializes those fields from the `InferenceSettings` defaults before the backend check, then continues overriding them only on the torch path. With this change, the prompt helper returns valid settings instead of crashing.

What does this change?

  • Initializes `generate_comp` and `gpu_post_processing` before backend-specific prompting.

  • Preserves existing torch behavior: torch users are still prompted for composition preview and GPU post-processing settings.

  • Makes non-torch backends such as MLX fall back to the `InferenceSettings` defaults instead of crashing.

Checklist

  • uv run pytest passes
  • uv run ruff check passes
  • uv run ruff format --check passes

Raiden129 and others added 2 commits March 26, 2026 22:02
Prevent `_prompt_inference_settings()` from raising `UnboundLocalError` when the resolved backend is not `torch` (for example, MLX on Apple Silicon).
@raybrownco

Thanks for the PR, @Raiden129 - my team's running into this problem at the moment and I was just about to dig into a fix myself. Looks like I don't have to. Kudos!

